Market Roundup October 10, 2003

IBM Announces the Second Movement of
Project Symphony
IBM has announced a new computing system called IBM Web
Infrastructure Orchestration, designed to respond dynamically to Web capacity
needs and utilization issues. This product bundle, based on the IBM eServer BladeCenter, consists of pre-integrated versions of
WebSphere, DB2, Tivoli Storage Manager, Tivoli Monitoring, and TotalStorage hardware, coordinated by defined business
policies and managed by Tivoli’s Intelligent ThinkDynamic
Orchestrator. Prices for the base Tivoli Intelligent ThinkDynamic Orchestrator software
start at $20,000; the cost of the combined hardware, software, and
services bundle depends on the configuration requirements. IBM also announced BladeCenter Standby Capacity on Demand, an on-demand
offering of a pre-integrated BladeCenter chassis
featuring seven blades standard and seven blades as "standby
capacity" for activation as additional blade capacity is needed.
This is the second product to result from Project Symphony,
the code-named automation/optimization effort focused on marrying IBM's
previous automation technology effort, eLiza,
with virtualization technology that IBM acquired through Think Dynamics. This
effort to model system resources, determine optimized responses, and deliver
dependable results is non-trivial. Fortunately for IBM, Project Symphony is
being helped up the learning curve by the early results of the seven-year
contract with J.P. Morgan announced at year end 2002. By leveraging the initial
results of that effort, IBM was able to transpose many proprietary methods of
securing, partitioning, and optimizing the data into the emerging product mix
that reflects Project Symphony’s goal of developing the premier server
optimization solution. This is as critical for vendors as it is for customers.
Customers, continually sweating the costs, must recover the 60% to 80% capacity
underutilized in their server environments today as well as prepare to cope
with very uncertain futures. Vendors, facing both changing business models and
changing markets, must respond with products that lower customers’ costs and promote
customer confidence. The industry in general, and IBM, HP, and Sun in
particular, now demonstrates through both awareness and product commitment that utility
computing, by whatever name it may be promoted, is just such a response.
We believe this announcement reflects a solid commercial offering that features the experience and integration of a range of IBM-branded ingredients packaged with a “recipe” tested in IBM Consulting’s test kitchen. There are the usual and somewhat credible disclaimers as to the bundle’s core commitment to being “open,” but depending on the usability and effectiveness of this product, most customers will likely take the default workflows and add their own best practices to the automated management of their Web server environment. The chord this announcement struck for us is that once again IBM has abstracted a very complex and demanding problem into a strong tactical product response with a well-defined and focused story. This stands in direct opposition to the “boil the ocean” approaches other vendors have related. While IBM clearly faces broad technical competition from the usual suspects, its commercially structured offerings and approach are music to our ears.
Microsoft Revamps Partners Program
In conjunction with its worldwide partner conference,
Microsoft has announced a new partner program that will commence in January 2004.
The program will be phased in over the next eighteen months, and its goal is to
build closer relationships between Microsoft and its partners. One major
new component of this effort is called the “competency framework.” The
purpose of this framework is to establish a partner’s level of expertise within
a given set of solution areas which include Advanced Infrastructure, Business
Intelligence, Information Worker Productivity, ISV/Software Solutions,
Learning, Licensing and Software Asset Management, Networking Infrastructure,
OEM Hardware, and Security. In addition, the new program offers three partner
levels: Registered Member, Microsoft Certified Partner, and Microsoft Gold
Certified Partner. Each level provides an incremental increase in benefits. A
partner’s program level will be determined through a points program wherein partners
earn points in a number of areas, including customer satisfaction, competency
achievements, specific Microsoft certifications, new business wins through
migration to Microsoft products and upgrades, creation of certified
Microsoft-based applications, Microsoft Learning Product sales, Microsoft
license distribution, and other eligible training and assessments to be
determined.
It’s a new era in IT, and as such IT vendors must figure
out new ways to do business with a much more prudent set of customers. For
many IT vendors, this is no small challenge and in our mind partners are going
to play a key role in finding new ways to sell products to skittish and finicky
customers. However, as enterprises look to keep spending in check, vendors are
going to become more selective about how and with whom they spend their partner
dollars. They cannot afford to blindly throw money at partner programs, nor can
they let programs languish in half-hearted efforts. In this highly competitive
market, partners are free to pick and choose who they do business with and may
choose a path with less resistance and more assistance. VARs may in fact hold
the upper hand these days, as in many cases they own the relationship with the
buyer.
Microsoft’s new partner program is not a one-size-fits-all approach. The program will stratify partners along areas of expertise as well as recognize their level of contribution through a point system. Of course, the goal of any vendor-sponsored program is to increase sales of that vendor’s products; however, at first blush this appears to be more of a Microsoft-exclusive reseller rewards program than one targeted at a larger, more diverse partner community. Not surprisingly, the expertise segments map very closely to Microsoft’s own revised product segments rather than to more general industry segments. The partner points appear to drive home Microsoft’s own business agenda. For example, points will be awarded for securing new business that involves migrating a customer to Microsoft solutions from other vendors, as well as for pushing customers to upgrade to new Microsoft versions. With Microsoft using its partner program to extend its reach and shore up its installed base, the program will appeal most to those who see the world through Microsoft-colored lenses. Others, who want to dip their toes in the water without drinking the Kool-Aid, and who wish to retain their identity along with their taste for a varied IT diet, might want to limit their relationship to the Registered Member level.
A federal judge in Minnesota this week issued a ruling
barring the state of Minnesota — and its Public Utilities Commission — from
issuing and enforcing rules similar to those applied to telephone companies
against Vonage, a voice-over IP provider of local and long-distance calling
services. The ruling means that the MPUC cannot require Vonage — or other VOIP
providers — to obtain a telephone operator’s license or to pay fees used to
support 911 emergency call services. Both the Wyoming and California PUCs have also stated that they have jurisdiction and
regulatory oversight over VOIP providers. In addition, lawsuits have been filed by
standard phone carriers against VOIP providers, arguing that they are not
paying the full costs required under various telephone industry regulations.
Looking into the core of this issue, it would appear to
revolve around a very simple question: What is the difference between voice
bits and data bits? Over many decades, the traditional telephone industry has
wrangled with courts and PUCs across the land, and in the process has hammered out a wide range of
rules, regulations, and technical definitions of what is required of it and
what it can charge its customers. Into this very sharply defined regulatory
matrix comes a highly disruptive technology — the Internet and VOIP —
using new technology and applications of it to circumvent this
tangle of regulatory authority wholesale, at least for the time being.
There is no question that the Internet and its related technologies have forced well-established industries to reassess their overall long-term strategies and begin a painful process of adaptation to new realities. Here, we see the telephone industry facing the early stages of what may become a tidal wave of changing user behavior similar to the one that has engulfed the music industry. Call them revolutionary forces, if you will. We believe it would be foolish to assume that the only entities forced to undergo change in the ongoing battle between disruptive new technologies and their more established adversaries will be the aging dinosaurs. Such battles extract tolls from both sides, and we believe that the VOIP battle has significant potential to open the door to a new way of looking at the Internet and its related technologies, fundamentally challenging the idea that the Internet is all about “free.” It is likely that we are fast approaching the time in which the Internet’s maturity will be marked by higher degrees of regulation and taxation. While this may dismay many people, it is important to note that powerful forces are at work in this arena. States like Minnesota, California, and Wyoming — among many if not all of the other states, and other countries as well — are seeking new ways to capture regulatory licensing fees and tax revenues. These states face extraordinary budget crises and are making substantial service cuts to make up for revenue shortfalls. Will they continue to sit idly by while billions of dollars of otherwise taxable economic activity goes on under their very noses? Probably not. We suspect that while the Internet will continue to disrupt many industries, it too will face a time when it is altered by the changing circumstances surrounding it, thereby confirming its emergence as a truly mainstream technology phenomenon.
New reports indicate that employee defections at Sun
Microsystems are continuing, with the apparent departure of nineteen senior
engineers from the N1 programming shop in Colorado. They left to join Cassatt, a start-up company founded by Bill Coleman, a former Sun employee
and co-founder of BEA Systems. N1 was Sun’s data center management
and virtualization effort; Cassatt is focused on
virtualization and autonomic computing. Also, according to press reports, Dr. Yousef Khalidi, chief technology
officer for N1 products, has left the company and is joining Microsoft.
These departures follow those of Bill Joy, who left Sun
recently, and Ed Zander, who left last year. This
apparent talent drain comes in the wake of Sun’s recent restatement of fourth
quarter results and a warning that revenues in the present quarter will be
flat. All in all, not a good month for Sun, and with the loss of the N1 team
members, things would not appear to be looking up anytime soon.
Sun’s N1 efforts — which are akin to HP’s Adaptive Enterprise and IBM’s On Demand computing initiatives — are key to Sun turning around its apparently flagging fortunes. All three companies recognize the importance of taking the next step in improving the management, resilience, scalability, and security of ever more complex data centers through the virtualization of hardware, which in essence can give dumb boxes some real intelligence. The entry of Cassatt into this field — with a rumored $50 million in venture backing — indicates not only that industry veterans believe the market for these products is quite real, but also that there is still an opportunity to make a real killing by offering state-of-the-art virtualization products that to a great degree manage themselves. With grid computing, blade servers, and the quietly persistent mainframe combining to offer greater hardware scalability, virtualization provides a way to manage and logically partition CPUs, storage, and applications in the most efficient manner, creating logical rather than physical computing environments. While we are not ready to say the departure of these employees from Sun is a fatal blow, the injury is more than a mere flesh wound. Effective virtualization products will become a key value-add to commodity hardware products, and as such a key differentiator, one that will separate the men from the boys.
EMC and IBM: I’ll Show You
Mine if You’ll Show Me Yours
EMC and
IBM have announced an agreement to share APIs and improve interoperability and
compatibility between their storage systems. The goal of this program is to
make it easier for customers to use their respective products in heterogeneous
environments. The program includes an exchange of APIs for disk storage,
including those related to SMI-S (Storage Management Initiative Specification).
They have also put in place a framework for rapid escalation and resolution of
issues arising from joint installations, and EMC have purchased, for use with Symmetrix, a license to IBM’s interfaces for the TotalStorage
Enterprise Storage Server, which include FlashCopy,
Multiple Allegiance, and Parallel Access Volumes.
This
truly is one of a long line of API swaps that have been occurring for the last
year and a half. While IBM was one of the last holdouts with EMC, this agreement is
certainly in keeping with the need to make products interoperate. While
marketing from server and application companies has for years acknowledged the existence of
other products and demonstrated some degree of interoperability,
storage vendors have continued to operate as if in a vacuum. This, of course, has
done nothing to please customers. Fortunately, the market and the desire to
succeed with new virtualisation technologies have conspired to force these
companies to climb down out of their ivory towers and face the real world: customers have heterogeneous environments, and
storage is a hodgepodge resulting from various projects, infrastructure choices,
and company mergers. While most of these
companies have produced some variant of a virtualisation device, the ability to
support other companies’ products within those virtualised universes has been
severely limited. This mutual sharing of APIs and cross-licensing of technologies
should make life easier for customers who have no intention of tossing out kit
to humour their vendor-du-jour but who want to take
advantage of the future of networked storage.
This deal is also a win for both EMC and IBM. EMC remain the industry leader in storage, and IBM need to be interoperable if they want to integrate their storage with EMC’s. At the same time, IBM are the leader in servers, and EMC will continue to sell significant storage in that space. Knowing that the companies are sharing technologies will give customers of every environmental flavour greater confidence in their choices. At the end of the day, storage systems are strategic because they are responsible for the protection, availability, and movement of data; yet they are not the most important part of the ecosystem, a title usually reserved for the customer application. And that means that storage vendors are going to have to stop focusing on devices and continue to shift their focus to data manageability, reliability, and security, thus ensuring that they make life easier for their customers instead of building great walls to protect their device-oriented installed-base revenue streams.